Learning Feature Sparse Principal Subspace

Neural Information Processing Systems

This paper presents new algorithms to solve the feature-sparsity constrained PCA problem (FSPCA), which performs feature selection and PCA simultaneously. Existing optimization methods for FSPCA require data distribution assumptions and lack global convergence guarantees. Though the general FSPCA problem is NP-hard, we show that, for a low-rank covariance, FSPCA can be solved globally (Algorithm 1). We then propose another strategy (Algorithm 2) to solve FSPCA for a general covariance by iteratively building a carefully designed low-rank proxy. We prove (data-dependent) approximation bounds and convergence guarantees for the new algorithms. For covariance spectra with exponential or Zipf distributions, we provide exponential or posynomial approximation bounds, respectively. Experimental results show the promising performance and efficiency of the new algorithms compared with state-of-the-art methods on both synthetic and real-world datasets.
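To make the problem concrete: FSPCA seeks an orthonormal W with at most k nonzero rows maximizing tr(Wᵀ A W) for a covariance A. For a fixed feature support S, the optimum is the top-m eigenvectors of the submatrix A[S, S], so the hard part is choosing S. The following sketch (not the paper's Algorithm 1; a naive exhaustive search, feasible only for tiny dimension d) illustrates this structure:

```python
import itertools
import numpy as np

def fspca_bruteforce(A, k, m):
    """Illustrative brute-force FSPCA: enumerate all k-feature supports.

    For each support S, the best subspace is spanned by the top-m
    eigenvectors of the principal submatrix A[S, S]. Exponential in d,
    so only usable as a sanity check on tiny problems.
    """
    d = A.shape[0]
    best_val, best_W = -np.inf, None
    for S in itertools.combinations(range(d), k):
        sub = A[np.ix_(S, S)]
        vals, vecs = np.linalg.eigh(sub)      # eigenvalues in ascending order
        val = vals[-m:].sum()                 # variance captured by top-m directions
        if val > best_val:
            W = np.zeros((d, m))
            W[list(S), :] = vecs[:, -m:]      # embed the sparse solution back into R^d
            best_val, best_W = val, W
    return best_val, best_W

# toy covariance from synthetic data (dimensions chosen arbitrarily for the demo)
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 6))
A = X.T @ X / 50
val, W = fspca_bruteforce(A, k=3, m=2)
```

The returned W is orthonormal with exactly k nonzero rows, i.e. a feasible point of the FSPCA constraint set; the paper's contribution is achieving guarantees without this exhaustive search.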


Review for NeurIPS paper: Learning Feature Sparse Principal Subspace

Neural Information Processing Systems

Strengths: The main contributions are novel and original, to my knowledge. The global optimality of alg.1 is very interesting and somewhat surprising. Given the known hardness results for the FSPCA problem, theorem 4.1 characterizes a subclass of problems that can be solved exactly. For the high-rank setting, they provide a new iterative minorization-maximization procedure, alg.2, which repeatedly applies alg.1 to a low-rank proxy covariance. I found the MM construction here novel, as existing results mostly use power-method-type procedures as the main algorithmic framework.
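The MM idea the review highlights can be illustrated with a simpler, generic scheme (this is not the paper's proxy construction): since tr(Wᵀ A W) is convex in W for PSD A, its tangent g_t(W) = 2 tr(Wᵀ A W_t) − tr(W_tᵀ A W_t) minorizes it and is tight at W_t, so exactly maximizing g_t over the sparse orthonormal set yields monotone ascent. The sketch below maximizes the surrogate by exhaustive support search (tiny d only); per support, the maximizer of tr(Wᵀ B) over orthonormal W is given by the polar factor of B's rows:

```python
import itertools
import numpy as np

def linearized_mm_fspca(A, k, m, iters=15, seed=0):
    """Hedged sketch of a linearized MM loop for max tr(W^T A W)
    s.t. W^T W = I and at most k nonzero rows of W.

    Not the paper's Algorithm 2: here the surrogate is the tangent
    minorizer of the convex objective, maximized exactly by brute
    force over supports, which still guarantees monotone ascent.
    """
    d = A.shape[0]
    rng = np.random.default_rng(seed)
    # random feasible start: orthonormal columns on a random support
    S0 = rng.choice(d, size=k, replace=False)
    Q, _ = np.linalg.qr(rng.standard_normal((k, m)))
    W = np.zeros((d, m)); W[S0, :] = Q
    history = []
    for _ in range(iters):
        B = A @ W                                  # gradient of the objective at W
        best_val, best_W = -np.inf, None
        for S in itertools.combinations(range(d), k):
            BS = B[list(S), :]
            U, s, Vt = np.linalg.svd(BS, full_matrices=False)
            val = s.sum()                          # max tr(W_S^T B_S) = nuclear norm of B_S
            if val > best_val:
                Wn = np.zeros((d, m)); Wn[list(S), :] = U @ Vt
                best_val, best_W = val, Wn
        W = best_W
        history.append(np.trace(W.T @ A @ W))      # objective never decreases
    return W, history

rng = np.random.default_rng(1)
X = rng.standard_normal((40, 6))
A = X.T @ X / 40
W, history = linearized_mm_fspca(A, k=3, m=2)
```

Ascent follows from the standard MM argument: f(W_{t+1}) ≥ g_t(W_{t+1}) ≥ g_t(W_t) = f(W_t), since W_t remains feasible at every step.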

